1,982 research outputs found

    Mechanical chest-compression devices: current and future roles

    Purpose of review: It is recognised that the quality of CPR is an important predictor of outcome from cardiac arrest, yet studies consistently demonstrate that the quality of CPR performed in real life is frequently sub-optimal. Mechanical chest compression devices provide an alternative to manual CPR. This review will consider the evidence and current indications for the use of these devices. Recent findings: Physiological and animal data suggest that mechanical chest compression devices are more effective than manual CPR. However, there is no high-quality evidence showing improved outcomes in humans. There are specific circumstances where it may not be possible to perform manual CPR effectively, e.g. during ambulance transport to hospital, en route to and during cardiac catheterisation, prior to organ donation and during diagnostic imaging, where using these devices may be advantageous. Summary: There is insufficient evidence to recommend the routine use of mechanical chest compression devices. There may be specific circumstances, when CPR is difficult or impossible, in which mechanical devices may play an important role in maintaining circulation. There is an urgent need for definitive clinical and cost-effectiveness trials to confirm or refute the place of mechanical chest compression devices during resuscitation.

    Field D* pathfinding in weighted simplicial complexes

    The development of algorithms to efficiently determine an optimal path through a complex environment is a continuing area of research within Computer Science. When such environments can be represented as a graph, established graph search algorithms, such as Dijkstra’s shortest path and A*, can be used. However, many environments are constructed from a set of regions that do not conform to a discrete graph. The Weighted Region Problem was proposed to address the problem of finding the shortest path through a set of such regions, weighted with values representing the cost of traversing each region. Robust solutions to this problem are computationally expensive, since finding shortest paths across a region requires expensive minimisation. Sampling approaches construct graphs by introducing extra points on region edges and connecting them with edges criss-crossing the region; Dijkstra or A* is then applied to compute shortest paths. The connectivity of these graphs is high, and such techniques are thus not particularly well suited to environments where the weights and representation frequently change. The Field D* algorithm, by contrast, computes the shortest path across a grid of weighted square cells and has replanning capabilities that cater for environmental changes. However, representing an environment as a weighted grid (an image) is not space-efficient, since high resolution is required to produce accurate paths through areas containing features sensitive to noise. In this work, we extend Field D* to weighted simplicial complexes, specifically triangulations in 2D and tetrahedral meshes in 3D.
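
    The abstract above contrasts Field D* with the classical graph-search baseline (Dijkstra's shortest path and A*). As a point of reference only, below is a minimal Python sketch of Dijkstra's algorithm on a weighted graph; the example graph, node names and edge costs are illustrative assumptions, not code or data from the thesis.

# Dijkstra's shortest path over a weighted graph, the baseline the abstract
# refers to. The graph below is a hypothetical example, not thesis data.
import heapq

def dijkstra(graph, start, goal):
    """graph: dict mapping node -> list of (neighbour, edge_cost) pairs."""
    dist = {start: 0.0}
    prev = {}
    queue = [(0.0, start)]
    while queue:
        d, node = heapq.heappop(queue)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        for neighbour, cost in graph.get(node, []):
            nd = d + cost
            if nd < dist.get(neighbour, float("inf")):
                dist[neighbour] = nd
                prev[neighbour] = node
                heapq.heappush(queue, (nd, neighbour))
    # Walk predecessors back from the goal to recover the path.
    path, node = [], goal
    while node != start:
        path.append(node)
        node = prev[node]
    path.append(start)
    return list(reversed(path)), dist[goal]

# Hypothetical weighted graph: edge costs stand in for region traversal costs.
graph = {
    "A": [("B", 1.0), ("C", 4.0)],
    "B": [("C", 2.0), ("D", 5.0)],
    "C": [("D", 1.0)],
    "D": [],
}
print(dijkstra(graph, "A", "D"))  # (['A', 'B', 'C', 'D'], 4.0)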

    Practical Implementation of Machine Tool Metrology and Maintenance Management Systems

    Maximising asset utilisation and minimising downtime and waste are becoming increasingly important to all manufacturing facilities as competition increases and profits decrease. Tools to assist with monitoring these machining processes are increasingly in demand. A system designed to fulfil the needs of machine tool operators and supervisors has been developed, and its impact on the precision manufacturing industry is being considered. The benefits of implementing this system, compared with traditional methods, are discussed here.

    Formation of the first three gravitational-wave observations through isolated binary evolution

    During its first 4 months of taking data, Advanced LIGO has detected gravitational waves from two binary black hole mergers, GW150914 and GW151226, along with the statistically less significant binary black hole merger candidate LVT151012. We use our rapid binary population synthesis code COMPAS to show that all three events can be explained by a single evolutionary channel -- classical isolated binary evolution via mass transfer, including a common envelope phase. We show that all three events could have formed in low-metallicity environments (Z = 0.001) from progenitor binaries with typical total masses ≳ 160 M⊙, ≳ 60 M⊙ and ≳ 90 M⊙ for GW150914, GW151226 and LVT151012, respectively. Comment: Published in Nature Communications.

    Constellations: A participatory, online application for research collaboration in higher education interdisciplinary courses

    The research establishes a model for online learning centring on the needs of integrative knowledge practices. Through the metaphor of Constellations, the practice-based research explores the complexities of working within interdisciplinary learning contexts and the potential of tools such as the Folksonomy learning platform for providing necessary conceptual support.

    Identification and Reconstruction of Bullets from Multiple X-Rays

    The 3D shape and position of objects inside the human body are commonly detected using Computed Tomography (CT) scanning. CT is an expensive diagnostic option in economically disadvantaged areas, and the radiation dose experienced by the patient is significant. In this dissertation, we present a technique for reconstructing the 3D shape and position of bullets from multiple X-rays. This technique makes use of ubiquitous X-ray equipment and a small number of X-rays to reduce the radiation dose. Our work relies on image segmentation and volume reconstruction techniques. We present a method for segmenting bullets out of X-rays, based on their signature in intensity profiles. This signature takes the form of a distinct plateau, which we model with a number of parameters. This model is used to identify horizontal and vertical line segments within an X-ray corresponding to a bullet signature. Regions containing confluences of these line segments are selected as bullet candidates. The actual bullet is thresholded out of the region based on a range of intensities occupied by the intensity profiles that contributed to the region. A simple volume reconstruction algorithm is implemented that back-projects the silhouettes of bullets obtained from our segmentation technique. This algorithm operates on a 3D voxel volume represented as an octree. The reconstruction is reduced to the 2D case by reconstructing one slice of the voxel volume at a time. Our segmentation algorithm achieves good results: when compared with a manual segmentation, it matches 90% of the bullet pixels in nine of the twelve test X-rays. Our reconstruction algorithm produces acceptable results: it achieves a 70% match for a test case in which we compare a simulated bullet with a reconstructed bullet.
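
    The segmentation step above keys on a "plateau" signature in 1D intensity profiles. The sketch below illustrates that general idea in Python: it flags runs of samples in a profile that are both bright and locally flat. The thresholds, parameter names and example profile are assumptions for illustration, not the dissertation's actual model or values.

# Illustrative plateau detector for a 1D intensity profile (e.g. one X-ray row).
# All thresholds below are hypothetical parameters, not dissertation values.
import numpy as np

def find_plateaus(profile, min_intensity=200.0, max_slope=2.0, min_width=5):
    """Return (start, end) index pairs of candidate plateau segments."""
    profile = np.asarray(profile, dtype=float)
    bright = profile >= min_intensity                 # bright enough
    flat = np.abs(np.gradient(profile)) <= max_slope  # locally flat
    candidate = bright & flat
    segments, start = [], None
    for i, flag in enumerate(candidate):
        if flag and start is None:
            start = i                                 # a segment opens
        elif not flag and start is not None:
            if i - start >= min_width:
                segments.append((start, i))           # segment long enough
            start = None
    if start is not None and len(candidate) - start >= min_width:
        segments.append((start, len(candidate)))
    return segments

# Hypothetical profile: a background ramp with a bright, flat span in the middle.
row = np.concatenate([np.linspace(50, 120, 40),
                      np.full(12, 230.0),
                      np.linspace(120, 50, 40)])
print(find_plateaus(row))  # [(41, 51)] -- one candidate span inside the plateau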

    Is protocolised weaning that includes early extubation onto non-invasive ventilation more cost effective than protocolised weaning without non-invasive ventilation? Findings from the Breathe Study

    Background: Optimising techniques to wean patients from invasive mechanical ventilation (IMV) remains a key goal of intensive care practice. The use of non-invasive ventilation (NIV) as a weaning strategy (transitioning patients who are difficult to wean to early NIV) may reduce mortality, ventilator-associated pneumonia and intensive care unit (ICU) length of stay. Objectives: Our objectives were to determine the cost effectiveness of protocolised weaning, including early extubation onto NIV, compared with weaning without NIV in a UK National Health Service setting. Methods: We conducted an economic evaluation alongside a multicentre randomised controlled trial. Patients were randomised to either protocol-directed weaning from mechanical ventilation or ongoing IMV with daily spontaneous breathing trials. The primary efficacy outcome was time to liberation from ventilation. Bivariate regression of costs and quality-adjusted life-years (QALYs) provided estimates of the incremental cost per QALY and incremental net monetary benefit (INMB), overall and for subgroups [presence/absence of chronic obstructive pulmonary disease (COPD) and operative status]. Long-term cost effectiveness was determined through extrapolation of survival curves using flexible parametric modelling. Results: NIV was associated with a mean INMB of £620 (US$885) at a cost-effectiveness threshold of £20,000 per QALY, with a corresponding probability of 58% that NIV is cost effective. The probability that NIV is cost effective was higher for those with COPD (84%). NIV was cost effective over 5 years, with an estimated incremental cost-effectiveness ratio of £4618 (US$6594) per QALY gained. Conclusions: The probability of NIV being cost effective relative to weaning without NIV ranged between 57 and 59% overall and between 82 and 87% for the COPD subgroup.
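
    For readers unfamiliar with the summary measures, the INMB and ICER reported above are simple functions of the incremental cost and incremental QALYs. The short Python sketch below restates those formulas; the cost and QALY deltas in the example are hypothetical placeholders, not Breathe Study estimates (only the £20,000-per-QALY threshold comes from the abstract).

# INMB = threshold * dQALY - dCost   (threshold = willingness to pay per QALY)
# ICER = dCost / dQALY
# The example deltas below are hypothetical placeholders, NOT trial results.

def inmb(d_cost, d_qaly, threshold=20_000):
    """Incremental net monetary benefit, in GBP."""
    return threshold * d_qaly - d_cost

def icer(d_cost, d_qaly):
    """Incremental cost-effectiveness ratio, in GBP per QALY gained."""
    return d_cost / d_qaly

# Hypothetical example: the intervention costs 400 GBP more and adds 0.05 QALYs.
print(inmb(d_cost=400, d_qaly=0.05))  # 600.0  (positive => cost effective at this threshold)
print(icer(d_cost=400, d_qaly=0.05))  # 8000.0 GBP per QALY gained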

    Field D* Pathfinding in Weighted Simplicial Complexes

    The development of algorithms to efficiently determine an optimal path through a complex environment is a continuing area of research within Computer Science. When such environments can be represented as a graph, established graph search algorithms, such as Dijkstra’s shortest path and A*, can be used. However, many environments are constructed from a set of regions that do not conform to a discrete graph. The Weighted Region Problem was proposed to address the problem of finding the shortest path through a set of such regions, weighted with values representing the cost of traversing each region. Robust solutions to this problem are computationally expensive, since finding shortest paths across a region requires expensive minimisation. Sampling approaches construct graphs by introducing extra points on region edges and connecting them with edges criss-crossing the region; Dijkstra or A* is then applied to compute shortest paths. The connectivity of these graphs is high, and such techniques are thus not particularly well suited to environments where the weights and representation frequently change. The Field D* algorithm, by contrast, computes the shortest path across a grid of weighted square cells and has replanning capabilities that cater for environmental changes. However, representing an environment as a weighted grid (an image) is not space-efficient, since high resolution is required to produce accurate paths through areas containing features sensitive to noise. In this work, we extend Field D* to weighted simplicial complexes, specifically triangulations in 2D and tetrahedral meshes in 3D. Such representations offer benefits in terms of space over a weighted grid, since fewer triangles can represent polygonal objects with greater accuracy than a large number of grid cells. By exploiting these savings, we show that Triangulated Field D* can produce an equivalent path cost to grid-based Multi-resolution Field D*, using up to an order of magnitude fewer triangles than grid cells and visiting an order of magnitude fewer nodes. Finally, as a practical demonstration of the utility of our formulation, we show how Field D* can be used to approximate a distance field on the nodes of a simplicial complex, and how this distance field can be used to weight the simplicial complex to produce contour-following behaviour by shortest paths computed with Field D*.
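
    The grid-based algorithm being generalised here relies on interpolating path costs along cell edges rather than restricting paths to graph nodes. As background only, the sketch below shows the core one-cell interpolation for the square-grid case (unit cell, path confined to a single cell of traversal cost c); the function name, the numerical minimisation and the example values are simplifying assumptions of mine, not the thesis's formulation for triangulations or tetrahedra.

# Core Field D*-style edge interpolation, square-grid case, one cell only:
#   cost(s) = min over y in [0, 1] of  c * sqrt(1 + y^2) + (1 - y) * g1 + y * g2
# where g1, g2 are path costs already known at the two endpoints of the far edge.
import math

def interpolated_cost(c, g1, g2, samples=1001):
    """Numerically minimise the interpolated crossing cost over y in [0, 1]."""
    best = float("inf")
    for i in range(samples):
        y = i / (samples - 1)
        cost = c * math.sqrt(1.0 + y * y) + (1.0 - y) * g1 + y * g2
        best = min(best, cost)
    return best

# Hypothetical values: crossing a unit cell of cost 1.0 towards an edge whose
# endpoints have path costs 2.0 and 1.5; the optimum lies inside the edge.
print(round(interpolated_cost(1.0, 2.0, 1.5), 3))  # 2.866, minimiser near y = 0.58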